Results 1 - 4 of 4
1.
Sci Rep ; 14(1): 7819, 2024 04 03.
Article in English | MEDLINE | ID: mdl-38570582

ABSTRACT

Heart disease is a leading cause of mortality worldwide, and accurately predicting cardiovascular disease remains a significant challenge in clinical data analysis. The present study introduces a prediction model that combines several sources of information with multiple established classification approaches. The proposed technique combines the genetic algorithm (GA) with the recursive feature elimination method (RFEM) to select relevant features, enhancing the model's robustness. The under-sampling clustering oversampling method (USCOM) addresses data imbalance, improving the model's predictive capabilities. Classification is performed by a multilayer deep convolutional neural network (MLDCNN) trained with the adaptive elephant herd optimization method (AEHOM). A comprehensive assessment shows that the proposed machine learning-based heart disease prediction method (ML-HDPM) performs strongly across the key evaluation metrics: during training it achieves an accuracy of 95.5% and a precision of 94.8%, its sensitivity (recall) reaches 96.2%, its F-score of 91.5% reflects well-balanced performance, and its specificity is 89.7%. These findings underscore the potential of ML-HDPM to transform heart disease prediction and help healthcare practitioners deliver precise diagnoses, with a substantial influence on patient care outcomes.


Subjects
Cardiovascular Diseases, Heart Diseases, Proboscidea Mammal, Humans, Animals, Heart Diseases/diagnosis, Cardiovascular Diseases/diagnosis, Cluster Analysis, Data Analysis, Machine Learning
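The abstract does not publish the GA + RFEM selection stage as code, but the genetic-algorithm half can be illustrated with a minimal pure-Python sketch. The fitness function below is a toy stand-in (it checks how often the sign of the selected-feature sum matches the label); in the paper the score would come from the trained classifier, and the resulting mask would then be refined by recursive feature elimination.

```python
import random

def fitness(mask, X, y):
    """Toy fitness: how often the sign of the selected-feature sum
    matches the label (a stand-in for a real classifier's accuracy)."""
    if not any(mask):
        return 0.0
    correct = 0
    for row, label in zip(X, y):
        score = sum(v for v, keep in zip(row, mask) if keep)
        correct += int((score > 0) == (label == 1))
    return correct / len(y)

def ga_select(X, y, pop_size=20, generations=30, seed=0):
    """Evolve a boolean feature mask: elitism, parents drawn from the
    fitter half, one-point crossover, and single-bit mutation."""
    rng = random.Random(seed)
    n_features = len(X[0])
    pop = [[rng.random() < 0.5 for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda m: fitness(m, X, y), reverse=True)
        next_gen = pop[:2]                       # keep the two best (elitism)
        while len(next_gen) < pop_size:
            a, b = rng.sample(pop[:10], 2)       # parents from the fitter half
            cut = rng.randrange(1, n_features)   # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:               # occasional single-bit mutation
                i = rng.randrange(n_features)
                child[i] = not child[i]
            next_gen.append(child)
        pop = next_gen
    return max(pop, key=lambda m: fitness(m, X, y))
```

On a toy dataset where only one feature carries signal, the evolved mask converges toward that feature; the population size, mutation rate, and generation count here are illustrative defaults, not the paper's settings.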
2.
Sci Rep ; 13(1): 1004, 2023 01 18.
Article in English | MEDLINE | ID: mdl-36653424

ABSTRACT

Industrial Internet of Things (IIoT)-based systems have become an important part of industry consortium systems because of their rapid growth and wide-ranging applications. The physical objects interconnected in an IIoT network communicate with one another and simplify decision-making by observing and analyzing the surrounding environment; to make such intelligent decisions, devices must exchange data. However, as the devices in IIoT networks multiply and connection methods diversify, traditional security frameworks show many shortcomings, including vulnerability to attack, data latency, inefficient data sharing, and a lack of proper authentication. Blockchain technology has the potential to enable the safe distribution of the big data generated by the IIoT, but prevailing blockchain data-sharing methods concentrate only on the data exchanged among parties, not on the efficiency of sharing and storage. Hence, an element-based K-harmonic means clustering algorithm (CA) is proposed for effective data sharing among entities, together with an underweight data block (UDB) algorithm that overcomes the obstacle of storage space. The performance metrics considered for evaluating the proposed framework are the sum of squared errors (SSE), time complexity for different m values, and storage complexity with CPU utilization. Experiments were conducted in the MATLAB 2018a simulation environment. The results show that the proposed model achieves better blockchain-based sharing and storage, making it appropriate for the IIoT.


Subjects
Blockchain, Industries, Algorithms, Benchmarking, Big Data, Computer Security
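The paper's element-based variant is not reproduced here, but the plain K-harmonic means update it builds on can be sketched in a few lines of pure Python. Every point contributes a soft, harmonically weighted pull on every centre, which is what makes KHM less sensitive to initialisation than k-means; the 2-D points, the `init` argument, and the exponent p = 3.5 are illustrative assumptions, not the paper's configuration.

```python
import math

def khm(points, k, init, p=3.5, iters=50):
    """Plain K-harmonic means on 2-D points.

    Each point pulls on centre j with weight d_j^(-p-2) / (sum_j d_j^(-p))^2,
    so no point is ever 'owned' by a single centre as in k-means."""
    centers = [tuple(c) for c in init]
    eps = 1e-9                                     # avoid division by zero at exact hits
    for _ in range(iters):
        num = [[0.0, 0.0] for _ in range(k)]
        den = [0.0] * k
        for x in points:
            d = [max(math.dist(x, c), eps) for c in centers]
            s1 = sum(di ** (-p) for di in d)
            for j in range(k):
                w = d[j] ** (-p - 2) / (s1 * s1)   # combined membership * weight
                num[j][0] += w * x[0]
                num[j][1] += w * x[1]
                den[j] += w
        centers = [(num[j][0] / den[j], num[j][1] / den[j]) for j in range(k)]
    return centers
```

On two well-separated point clouds the centres converge to the cluster means within a few iterations, even when the initial centres are placed between the clusters.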
3.
PeerJ Comput Sci ; 8: e1040, 2022.
Article in English | MEDLINE | ID: mdl-35875649

ABSTRACT

In recent research, artificial intelligence techniques have been used for computer vision, big data analysis, and detection systems. The development of these advanced technologies has also increased security and privacy issues. One such issue is deepfakes, a portmanteau of "deep learning" and "fake": fake images or videos produced with artificial intelligence approaches and created for political abuse, fake data transfer, and pornography. This paper develops a deepfake detection method that examines the computer vision features of the digital content. Features based on frame changes are extracted using a proposed deep learning model, the Cascaded Deep Sparse Auto Encoder (CDSAE), trained by a temporal CNN. Detection is then performed by a Deep Neural Network (DNN) that classifies images/videos as deepfake or real. The proposed model is implemented on the Face2Face, FaceSwap, and DFDC datasets and achieves an improved detection rate compared to traditional deepfake detection approaches.
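The CDSAE and temporal CNN themselves are not public, but the "frame change" signal they consume can be illustrated with a toy extractor: the mean absolute pixel difference between consecutive frames, which spikes where a manipulated frame breaks temporal consistency. The flat greyscale frame lists and the simple threshold rule below are hypothetical stand-ins for the paper's learned components.

```python
def frame_change_features(frames):
    """Mean absolute per-pixel difference between consecutive frames.

    `frames` is a list of equal-length flat greyscale pixel lists;
    the result has one value per frame transition."""
    feats = []
    for prev, cur in zip(frames, frames[1:]):
        diff = sum(abs(a - b) for a, b in zip(prev, cur)) / len(prev)
        feats.append(diff)
    return feats

def flag_anomalies(feats, factor=3.0):
    """Flag transitions whose change is far above the average — a crude
    stand-in for the learned DNN classifier in the paper."""
    mean = sum(feats) / len(feats)
    return [f > factor * mean for f in feats]
```

A single inconsistent frame produces two large transitions (into and out of it), which is exactly the kind of temporal cue a learned model would pick up on much more robustly.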

4.
PeerJ Comput Sci ; 7: e569, 2021.
Article in English | MEDLINE | ID: mdl-34401472

ABSTRACT

Cloud computing is an evolving field of technology that allows data and programs to be stored, accessed, and executed over the internet while offering a variety of information-related services. With cloud information services, it is essential that information be saved securely and distributed safely across numerous users. Cloud information storage has suffered from issues of information integrity, data security, and access by unauthenticated users. Distributing and storing data among several users is highly scalable and cost-efficient but results in data redundancy and security issues. In this article, a biometric authentication scheme is proposed that grants access permission to requesting users in a cloud-distributed environment while alleviating data redundancy. To achieve this, service providers use a cryptographic technique to generate a bio-key for authentication that is accessible only to authenticated users. A Gabor filter with distributed security, and encryption using XOR operations, is used to generate the proposed bio-key (biometric generated key) and to perform data deduplication in the cloud, ensuring both the avoidance of data redundancy and security. The proposed method is compared with existing algorithms, such as convergent encryption (CE), leakage resilient (LR), randomized convergent encryption (RCE), and the secure de-duplication scheme (SDS), to evaluate de-duplication performance. Our comparative analysis shows that the proposed scheme incurs lower computation and communication costs than existing schemes.
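The abstract does not give the exact Gabor-plus-XOR construction, so the following is a minimal sketch, assuming the Gabor features arrive as a float vector, of how a deterministic bio-key could be derived by XOR-ing a hash of the features with a provider secret. Determinism across sessions is what lets the provider detect duplicate uploads, while the secret keeps the key unusable outside that provider; real biometric key schemes additionally need error tolerance (fuzzy extraction), which simple rounding does not provide.

```python
import hashlib

def derive_bio_key(features, provider_secret: bytes) -> bytes:
    """Derive a 32-byte bio-key: SHA-256 of the quantised feature vector,
    XOR-ed byte-wise with SHA-256 of the provider's secret.

    Quantising (rounding) the features gives the same key for the same
    feature vector across sessions, which enables duplicate detection."""
    quantised = ",".join(f"{v:.2f}" for v in features).encode()
    feat_digest = hashlib.sha256(quantised).digest()
    secret_digest = hashlib.sha256(provider_secret).digest()
    return bytes(a ^ b for a, b in zip(feat_digest, secret_digest))
```

Identical features under the same provider secret always yield the same key (so duplicates are detectable), while a different provider secret yields an unrelated key, so nothing leaks across providers.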
